On the rate of convergence of alternating minimization for non-smooth non-strongly convex optimization in Banach spaces

Authors

Abstract

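For orientation only: the alternating minimization scheme named in the title splits the variables into two blocks and exactly minimizes the objective over one block while the other is held fixed, then swaps roles. The sketch below runs this pattern on a smooth, strongly convex quadratic toy problem; the data and the quadratic model are illustrative assumptions and do not reflect the paper's non-smooth, non-strongly convex Banach-space setting or its rate analysis.

    import numpy as np

    def alternating_minimization(Q, c, iters=100):
        """Minimize F(z) = 0.5 * z^T Q z + c^T z over z = (x, y) by exact
        block updates: minimize over x with y fixed, then over y with x fixed.
        Toy smooth quadratic, used only to show the iteration pattern."""
        n = Q.shape[0] // 2
        Qxx, Qxy = Q[:n, :n], Q[:n, n:]
        Qyx, Qyy = Q[n:, :n], Q[n:, n:]
        cx, cy = c[:n], c[n:]
        x, y = np.zeros(n), np.zeros(n)
        for _ in range(iters):
            # x-update: solve grad_x F = Qxx x + Qxy y + cx = 0
            x = np.linalg.solve(Qxx, -(Qxy @ y + cx))
            # y-update: solve grad_y F = Qyx x + Qyy y + cy = 0
            y = np.linalg.solve(Qyy, -(Qyx @ x + cy))
        return x, y

    rng = np.random.default_rng(0)
    M = rng.standard_normal((4, 4))
    Q = M @ M.T + 0.5 * np.eye(4)          # symmetric positive definite coupling
    c = rng.standard_normal(4)
    x, y = alternating_minimization(Q, c)
    print(np.concatenate([x, y]))          # should be close to -Q^{-1} c
    print(np.linalg.solve(Q, -c))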

Related articles

A Family of SQA Methods for Non-Smooth Non-Strongly Convex Minimization with Provable Convergence Guarantees

We propose a family of sequential quadratic approximation (SQA) methods, the inexact regularized proximal Newton (IRPN) method, to minimize a sum of smooth and non-smooth convex functions. The proposed algorithm offers strong convergence guarantees even when applied to problems with degenerate solutions, while allowing the inner minimization to be solved inexactly. We prove that IRPN conv...

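As a rough sketch of the sequential quadratic approximation idea (not the IRPN method itself, whose details are in the paper), the code below performs regularized proximal-Newton steps on an l1-regularized least-squares toy problem and solves each quadratic subproblem inexactly by a few proximal-gradient iterations. The problem data, the regularization parameter mu, and the inner solver are illustrative assumptions.

    import numpy as np

    def soft_threshold(z, t):
        """Proximal operator of t * ||.||_1."""
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def sqa_step(x, grad_f, hess_f, lam, mu=1e-3, inner_iters=50):
        """One regularized proximal-Newton (SQA-style) step for
        min_x f(x) + lam * ||x||_1.  The quadratic model
        q(d) = g^T d + 0.5 d^T (H + mu I) d + lam * ||x + d||_1
        is minimized inexactly by proximal gradient in d."""
        g = grad_f(x)
        H = hess_f(x) + mu * np.eye(x.size)   # regularization copes with degeneracy
        L = np.linalg.norm(H, 2)              # Lipschitz constant of the model gradient
        d = np.zeros_like(x)
        for _ in range(inner_iters):
            model_grad = g + H @ d
            d = soft_threshold(x + d - model_grad / L, lam / L) - x
        return x + d

    # toy problem: f(x) = 0.5 * ||A x - b||^2 plus lam * ||x||_1
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((20, 5)), rng.standard_normal(20), 0.1
    grad_f = lambda x: A.T @ (A @ x - b)
    hess_f = lambda x: A.T @ A
    x = np.zeros(5)
    for _ in range(10):
        x = sqa_step(x, grad_f, hess_f, lam)
    print(x)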

Mangasarian-Fromovitz and Zangwill Conditions for Non-Smooth Infinite Optimization Problems in Banach Spaces

In this paper we study optimization problems with infinitely many inequality constraints on a Banach space, where the objective function and the binding constraints are Lipschitz near the optimal solution. Necessary optimality conditions and constraint qualifications in terms of the Michel-Penot subdifferential are given.


Relaxed Majorization-Minimization for Non-Smooth and Non-Convex Optimization

We propose a new majorization-minimization (MM) method for non-smooth and non-convex programs, which is general enough to include the existing MM methods. Besides the local majorization condition, we only require that the difference between the directional derivatives of the objective function and its surrogate function vanishes as the number of iterations approaches infinity, which is a very...

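The majorize-then-minimize pattern described above can be illustrated with the classical reweighted least-squares treatment of an l1 regression loss (a stand-in example, not the relaxed MM method of the paper): each absolute value |r| is majorized at the current residual r_k by the quadratic r^2 / (2|r_k|) + |r_k| / 2, which touches it at r_k, so every MM step reduces to a weighted least-squares solve.

    import numpy as np

    def mm_l1_regression(A, b, iters=100, eps=1e-8):
        """Majorization-minimization for min_x sum_i |a_i^T x - b_i|.
        Each |r_i| is replaced by its quadratic majorizer at the current
        residual, and the resulting surrogate is minimized exactly."""
        x = np.linalg.lstsq(A, b, rcond=None)[0]
        for _ in range(iters):
            w = 1.0 / np.maximum(np.abs(A @ x - b), eps)   # majorization weights
            W = np.diag(w)
            x = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)  # minimize the surrogate
        return x

    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 3))
    b = A @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.standard_normal(50)
    print(mm_l1_regression(A, b))      # should be close to [1.0, -2.0, 0.5]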

A forward-backward splitting algorithm for the minimization of non-smooth convex functionals in Banach space

We consider the task of computing an approximate minimizer of the sum of a smooth and a non-smooth convex functional in Banach space. Motivated by the classical forward-backward splitting method for the subgradients in Hilbert space, we propose a generalization which involves the iterative solution of simpler subproblems. Descent and convergence properties of this new algorithm are...

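In the Hilbert-space (here, Euclidean) case, the forward-backward iteration referred to above is x_{k+1} = prox_{t g}(x_k - t * grad f(x_k)). The sketch below applies this classical scheme to a LASSO toy problem; it shows only the Hilbert-space prototype, not the Banach-space generalization developed in the paper, and the data and step size are illustrative assumptions.

    import numpy as np

    def forward_backward(grad_f, prox_g, x0, step, iters=200):
        """Classical forward-backward splitting:
        forward (gradient) step on the smooth part f, then a backward
        (proximal) step on the non-smooth part g."""
        x = x0
        for _ in range(iters):
            x = prox_g(x - step * grad_f(x), step)
        return x

    # example: LASSO  min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    rng = np.random.default_rng(2)
    A, b, lam = rng.standard_normal((30, 10)), rng.standard_normal(30), 0.2
    grad_f = lambda x: A.T @ (A @ x - b)
    prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L, L = Lipschitz constant of grad f
    print(forward_backward(grad_f, prox_g, np.zeros(10), step))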

Journal

Journal title: Optimization Letters

Year: 2021

ISSN: 1862-4472, 1862-4480

DOI: 10.1007/s11590-021-01753-w